Dictator functions maximize mutual information


Similar articles

Two Dictator Functions Maximize Mutual Information

Let (X, Y) denote n independent, identically distributed copies of two arbitrarily correlated Rademacher random variables on {−1, 1}. We prove that the inequality I(f(X); g(Y)) ≤ I(X; Y) holds for any two Boolean functions f, g : {−1, 1}ⁿ → {−1, 1}, where I(· ; ·) denotes mutual information. We further show that equality in general is achieved only by the dictator functions: f(x) = ±g(x) = ±...
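The inequality can be checked numerically for small n. The sketch below assumes each coordinate pair (Xᵢ, Yᵢ) comes from a doubly symmetric binary source with correlation ρ (a standard model of correlated Rademacher pairs; all helper names here are illustrative, not from the paper):

```python
import itertools
import math

def mi(joint):
    """Mutual information (bits) of a joint pmf given as {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

n, rho = 2, 0.5
# Doubly symmetric binary source: P(X_i = x, Y_i = y) = (1 + rho*x*y)/4.
pair = {(x, y): (1 + rho * x * y) / 4 for x in (-1, 1) for y in (-1, 1)}

# Joint pmf of the n-fold product (X, Y): coordinates are i.i.d. pairs.
cube = list(itertools.product((-1, 1), repeat=n))
joint_xy = {(x, y): math.prod(pair[(xi, yi)] for xi, yi in zip(x, y))
            for x in cube for y in cube}

I_XY = mi(joint_xy)  # equals n * I(X_1; Y_1) by independence across coordinates

def push(f, g):
    """Joint pmf of the pushforward (f(X), g(Y))."""
    out = {}
    for (x, y), p in joint_xy.items():
        k = (f(x), g(y))
        out[k] = out.get(k, 0.0) + p
    return out

dictator = lambda x: x[0]
xor = lambda x: x[0] * x[1]

# The theorem's inequality I(f(X); g(Y)) <= I(X; Y) holds for every choice:
for f in (dictator, xor):
    assert mi(push(f, f)) <= I_XY + 1e-12

# Dictators retain exactly the single-coordinate mutual information.
assert abs(mi(push(dictator, dictator)) - mi(pair)) < 1e-9
```

Sweeping over all 16 Boolean functions on {−1, 1}² would confirm the same bound exhaustively for this n.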


Canalizing Boolean Functions Maximize the Mutual Information

The information-processing capability of biologically motivated Boolean networks is of interest in recent information-theoretic research. One measure to quantify this capability is the well-known mutual information. Using Fourier analysis, we show that canalizing functions maximize the mutual information between an input variable and the outcome of the function. We prove our result for Boolean func...
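The quantity in question can be illustrated directly. The sketch below (my own example, assuming uniform inputs on {0,1}ⁿ) compares I(X₁; f(X)) for OR, which is canalizing because fixing its first input to 1 forces the output, against XOR, which no single input value can force:

```python
import itertools
import math

def mi(joint):
    """Mutual information (bits) of a joint pmf given as {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def input_output_mi(f, n):
    """I(X_1; f(X)) for X uniform on {0,1}^n."""
    joint = {}
    for x in itertools.product((0, 1), repeat=n):
        k = (x[0], f(x))
        joint[k] = joint.get(k, 0.0) + 2.0 ** -n
    return mi(joint)

OR = lambda x: max(x)       # canalizing: x_1 = 1 forces f = 1
XOR = lambda x: sum(x) % 2  # not canalizing: no input value fixes f

# The canalizing function leaks information about its first input;
# XOR leaks none at all.
assert input_output_mi(OR, 3) > 0
assert input_output_mi(XOR, 3) == 0
```

For n = 3 the OR value works out to H(1/8) − ½·H(1/4) ≈ 0.138 bits, while XOR gives exactly zero.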


Hard Clusters Maximize Mutual Information

In this paper, we investigate mutual information as a cost function for clustering, and show in which cases hard, i.e., deterministic, clusters are optimal. Using convexity properties of mutual information, we show that certain formulations of the information bottleneck problem are solved by hard clusters. Similarly, hard clusters are optimal for the information-theoretic co-clustering problem ...


Comments on “Canalizing Boolean Functions Maximize Mutual Information”

In their recent paper “Canalizing Boolean Functions Maximize Mutual Information,” Klotz et al. argued, via Fourier analysis on the hypercube, that canalizing Boolean functions maximize certain mutual information quantities. This note supplies short new proofs of their results based on a coupling argument and also clarifies a point on the necessity of considering randomized functions.


Mutual Information Functions Versus Correlation Functions

This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d) = 0 may or may not lead to M(d) ...
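Both functions can be estimated empirically from a ±1 sequence. The sketch below uses plug-in estimators (the helper names are mine, not the paper's); a period-2 sequence makes the binary-case connection concrete, since at even lags both Γ(d) and M(d) attain their maxima:

```python
import math
from collections import Counter

def corr(seq, d):
    """Empirical correlation function Gamma(d) = <s_i s_{i+d}> - <s_i>^2
    for a sequence of +/-1 symbols."""
    pairs = list(zip(seq, seq[d:]))
    mean = sum(seq) / len(seq)
    return sum(a * b for a, b in pairs) / len(pairs) - mean * mean

def mi_d(seq, d):
    """Empirical mutual information function M(d) in bits between
    symbols a distance d apart."""
    pairs = list(zip(seq, seq[d:]))
    n = len(pairs)
    pj = Counter(pairs)                    # joint counts of (s_i, s_{i+d})
    pa = Counter(a for a, _ in pairs)      # marginal counts of s_i
    pb = Counter(b for _, b in pairs)      # marginal counts of s_{i+d}
    return sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pj.items())

s = [1, -1] * 100  # period-2 sequence: +1, -1, +1, -1, ...
# At lag 2 the sequence repeats exactly, so dependence is maximal:
# Gamma(2) = 1 and M(2) = 1 bit.
assert abs(corr(s, 2) - 1.0) < 1e-9
assert abs(mi_d(s, 2) - 1.0) < 1e-9
```

For a binary alphabet, M(d) is determined by Γ(d), as the exact relation in the paper shows; for larger alphabets the two estimators can genuinely disagree about independence.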



Journal

Journal title: The Annals of Applied Probability

Year: 2018

ISSN: 1050-5164

DOI: 10.1214/18-aap1384